Bounds on f-Divergences and Related Distances

Author

  • Igal Sason
Abstract

Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker's inequality is derived for an arbitrary pair of probability distributions on a finite set. Following bounds on the relative entropy and Jeffreys' divergence, a tightened inequality for lossless source coding is derived and considered. Finally, a new inequality relating f-divergences is derived and studied.

Index Terms: Bhattacharyya distance, Chernoff information, chi-squared divergence, f-divergence, Hellinger distance, Jeffreys' divergence, lossless source coding, relative entropy (Kullback-Leibler divergence), total variation distance.
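As a minimal numerical illustration of two of the relations the abstract refers to, the sketch below checks Pinsker's inequality and the standard bound D(P‖Q) ≤ log(1 + χ²(P‖Q)) on a small finite alphabet. The distributions are arbitrary examples, not taken from the paper:

```python
import numpy as np

# Two arbitrary probability distributions on a 3-letter alphabet (illustrative only)
P = np.array([0.5, 0.3, 0.2])
Q = np.array([0.4, 0.4, 0.2])

tv = 0.5 * np.abs(P - Q).sum()       # total variation distance
kl = np.sum(P * np.log(P / Q))       # relative entropy D(P||Q), in nats
chi2 = np.sum((P - Q) ** 2 / Q)      # chi-squared divergence

# Pinsker's inequality (nats): tv <= sqrt(D(P||Q) / 2)
assert tv <= np.sqrt(kl / 2)
# Chi-squared upper bound on relative entropy: D(P||Q) <= log(1 + chi^2(P||Q))
assert kl <= np.log(1 + chi2)
```

Both inequalities hold for any pair of distributions with full support on a common finite set; the reversed Pinsker's inequality studied in the paper bounds D(P‖Q) from above in terms of the total variation distance instead.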


Similar sources

Bounds on f-Divergences and Related Distances (Irwin and Joan Jacobs Center for Communication and Information Technologies)

Derivation of tight bounds on f-divergences and related distances is of interest in information theory and statistics. This paper improves some existing bounds on f-divergences. In some cases, an alternative approach leads to a simplified proof of an existing bound. Following bounds on the chi-squared divergence, an improved version of a reversed Pinsker's inequality is derived for an arbitra...


Generalizing Jensen and Bregman divergences with comparative convexity and the statistical Bhattacharyya distances with comparable means

Comparative convexity is a generalization of convexity relying on abstract notions of means. We define the (skew) Jensen divergence and the Jensen diversity from the viewpoint of comparative convexity, and show how to obtain the generalized Bregman divergences as limit cases of skewed Jensen divergences. In particular, we report explicit formulas for these generalized Bregman divergences when con...
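The limit relation this abstract mentions, Bregman divergences arising as limit cases of scaled skew Jensen divergences, can be checked numerically. The sketch below is an illustration (not code from the paper) for the scalar generator F(x) = x log x, for which J_α(p, q)/α → B_F(p, q) as α → 0:

```python
import math

def F(x):
    # Convex generator (scalar), F(x) = x log x
    return x * math.log(x)

def bregman(p, q):
    # B_F(p, q) = F(p) - F(q) - F'(q) (p - q), with F'(x) = log x + 1
    return F(p) - F(q) - (math.log(q) + 1.0) * (p - q)

def skew_jensen(p, q, a):
    # J_a(p, q) = a F(p) + (1 - a) F(q) - F(a p + (1 - a) q)
    return a * F(p) + (1 - a) * F(q) - F(a * p + (1 - a) * q)

p, q = 2.0, 0.5                       # arbitrary example points
exact = bregman(p, q)                 # Bregman divergence B_F(p, q)
approx = skew_jensen(p, q, 1e-4) / 1e-4   # scaled skew Jensen at small skew
```

For small α the scaled skew Jensen divergence agrees with the Bregman divergence up to O(α), which is the limit behavior the abstract describes (here in the ordinary-convexity special case).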


A generalization of the Jensen divergence: The chord gap divergence

We introduce a novel family of distances, called the chord gap divergences, that generalizes the Jensen divergences (also called the Burbea-Rao distances), and study its properties. It yields a generalization of the celebrated statistical Bhattacharyya distance that is frequently met in applications. We report an iterative concave-convex procedure for computing centroids, and analyze the perfo...


On the Chi Square and Higher-Order Chi Distances for Approximating f-Divergences

We report closed-form formulas for calculating the Chi square and higher-order Chi distances between statistical distributions belonging to the same exponential family with affine natural space, and instantiate those formulas for the Poisson and isotropic Gaussian families. We then describe an analytic formula for the f-divergences based on Taylor expansions and relying on an extended class of Chi...
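For the Poisson family the chi-squared divergence indeed admits a closed form, χ²(P₁‖P₂) = exp(λ₁²/λ₂ − 2λ₁ + λ₂) − 1. The sketch below (rate values are arbitrary, and this is an illustration rather than the paper's code) compares that closed form against a direct truncated sum over the support:

```python
import math

def chi2_poisson_closed(l1, l2):
    # Closed form for chi^2(Poisson(l1) || Poisson(l2))
    return math.exp(l1 * l1 / l2 - 2.0 * l1 + l2) - 1.0

def chi2_poisson_numeric(l1, l2, kmax=100):
    # Direct evaluation of sum_k p1(k)^2 / p2(k) - 1, truncated at kmax.
    # Probabilities are updated iteratively to avoid huge factorials.
    p1, p2, s = math.exp(-l1), math.exp(-l2), 0.0
    for k in range(kmax):
        s += p1 * p1 / p2
        p1 *= l1 / (k + 1)
        p2 *= l2 / (k + 1)
    return s - 1.0

closed = chi2_poisson_closed(2.0, 3.0)
numeric = chi2_poisson_numeric(2.0, 3.0)
```

The two values agree to machine precision once the truncation point is past the mass of both distributions, since the summand decays super-exponentially in k.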


On Hölder Projective Divergences

We describe a framework to build distances by measuring the tightness of inequalities, and introduce the notion of proper statistical divergences and improper pseudo-divergences. We then consider the Hölder ordinary and reverse inequalities, and present two novel classes of Hölder divergences and pseudo-divergences that both encapsulate the special case of the Cauchy-Schwarz divergence. We repo...
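The Cauchy-Schwarz divergence that these Hölder divergences encapsulate has a simple closed form on discrete supports: the negative log of the cosine similarity between the two (unnormalized) densities. A short illustrative sketch with arbitrary example vectors:

```python
import numpy as np

def cauchy_schwarz_div(p, q):
    # D_CS(p, q) = -log( <p, q> / (||p|| ||q||) ); nonnegative by the
    # Cauchy-Schwarz inequality, zero iff p and q are proportional.
    return -np.log(np.dot(p, q) / (np.linalg.norm(p) * np.linalg.norm(q)))

p = np.array([0.5, 0.3, 0.2])   # arbitrary example densities
q = np.array([0.4, 0.4, 0.2])

d_pq = cauchy_schwarz_div(p, q)
d_pp = cauchy_schwarz_div(p, p)
```

Because it only requires inner products, this divergence is projective (invariant to positive rescaling of either argument), which is the property the Hölder framework generalizes via conjugate exponents.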




Publication date: 2014